- Title
- APIRecX: Cross-Library API Recommendation via Pre-Trained Language Model
- Creator
- Kang, Yuning; Wang, Zan; Zhang, Hongyu; Chen, Junjie; You, Hanmo
- Relation
- EMNLP 2021 The 2021 Conference on Empirical Methods in Natural Language Processing. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (Punta Cana, Dominican Republic 07-11 November, 2021) p. 3425-3436
- Publisher Link
- http://dx.doi.org/10.18653/v1/2021.emnlp-main.275
- Publisher
- Association for Computational Linguistics
- Resource Type
- conference paper
- Date
- 2021
- Description
For programmers, learning the usage of APIs (Application Programming Interfaces) of a software library is important yet difficult. API recommendation tools help developers by recommending which APIs to use next, given the APIs that have already been written. Traditionally, language models such as N-gram models are applied to API recommendation. However, because software libraries keep changing and new libraries keep emerging, new APIs are common. These new APIs can be seen as OOV (out-of-vocabulary) words and cannot be handled well by existing API recommendation approaches due to the lack of training data. In this paper, we propose APIRecX, the first cross-library API recommendation approach, which uses BPE (Byte Pair Encoding) to split each API call in each API sequence and pre-trains a GPT-based language model. It then recommends APIs by fine-tuning the pre-trained model. APIRecX can transfer the knowledge of existing libraries to a new library, and can recommend APIs that were previously regarded as OOV. We evaluate APIRecX on six libraries, and the results, compared against two typical API recommendation approaches, confirm its effectiveness.
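The key idea in the abstract is that BPE decomposes API calls into subword units, so an API from a previously unseen library can still be represented by subwords learned from other libraries. The sketch below is a minimal, self-contained illustration of standard BPE training and segmentation applied to API-call strings; it is not the paper's actual tokenizer, and the example API names are purely illustrative.

```python
from collections import Counter

def train_bpe(corpus, num_merges):
    """Learn BPE merge rules from a list of API-call strings.

    Each call starts as a sequence of characters plus an end-of-word
    marker; the most frequent adjacent symbol pair is merged repeatedly.
    """
    vocab = Counter(tuple(call) + ("</w>",) for call in corpus)
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            merged, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] += freq
        vocab = new_vocab
    return merges

def apply_bpe(call, merges):
    """Segment a (possibly unseen) API call using the learned merges."""
    symbols = list(call) + ["</w>"]
    for a, b in merges:
        merged, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and symbols[i] == a and symbols[i + 1] == b:
                merged.append(a + b)
                i += 2
            else:
                merged.append(symbols[i])
                i += 1
        symbols = merged
    return symbols

# Toy corpus of API calls from "existing" libraries (illustrative names).
corpus = ["list.add", "list.get", "map.get", "map.put"] * 5
merges = train_bpe(corpus, num_merges=8)
# An API call from a "new" library still decomposes into known subwords,
# so it is no longer a single OOV token for the language model.
print(apply_bpe("set.add", merges))
```

Because segmentation reuses subwords shared across libraries (e.g. the `.add` suffix), a GPT-style model pre-trained on such subword sequences can assign meaningful probabilities to calls it never saw whole during training.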
- Subject
- application programming interfaces (API); libraries; computational linguistics
- Identifier
- http://hdl.handle.net/1959.13/1446377
- Identifier
- uon:42850
- Identifier
- ISBN:9781955917094
- Language
- eng
- Reviewed